15 research outputs found

    TV white space and LTE network optimization toward energy efficiency in suburban and rural scenarios

    The radio spectrum is a limited resource. Demand for wireless communication services is growing exponentially, straining the spectrum available to accommodate new services. TV white space (TVWS) technologies allow dynamic usage of the spectrum: they provide wireless connectivity in the channels of the very high frequency (VHF) and ultra high frequency (UHF) television broadcasting bands. In this paper, we investigate and compare the coverage range, network capacity, and network energy efficiency of TVWS technologies and LTE. We consider Ghent, Belgium, and Boyeros, Havana, Cuba, to evaluate a realistic outdoor suburban and rural area, respectively. The comparison shows that TVWS networks achieve an energy efficiency 9 to 12 times higher than LTE networks.
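    The energy-efficiency comparison above boils down to a bits-per-joule ratio. A minimal sketch, with entirely hypothetical capacity and power figures chosen only to illustrate a ratio in the reported 9-12x range (the paper's actual figures are not reproduced here):

```python
def energy_efficiency(capacity_bps, power_w):
    """Network energy efficiency in bits per joule (capacity / power)."""
    return capacity_bps / power_w

# Hypothetical figures, for illustration only.
tvws_ee = energy_efficiency(18e6, 30.0)    # TVWS: 18 Mbps at 30 W
lte_ee = energy_efficiency(75e6, 1250.0)   # LTE: 75 Mbps at 1250 W
print(tvws_ee / lte_ee)  # -> 10.0 (within the reported 9-12x band)
```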

    Multi-objective optimization of cognitive radio networks

    New generation networks, based on Cognitive Radio technology, allow dynamic allocation of the spectrum, alleviating spectrum scarcity. These networks also lend themselves to dynamic operation for energy saving. In this paper, we present a novel wireless network optimization algorithm for cognitive radio networks based on a cloud sharing-decision mechanism. Three Key Performance Indicators (KPIs) were optimized: spectrum usage, power consumption, and exposure. For a realistic suburban scenario in the city of Ghent, Belgium, we determine the optimal trade-off between the KPIs. Compared to a traditional Cognitive Radio network design, our optimization algorithm for the cloud-based architecture simultaneously reduced the network power consumption by 27.5%, the average global exposure by 34.3%, and the spectrum usage by 34.5%. Even for the worst-case optimization (worst achieved result of a single KPI), our solution outperforms the traditional architecture by 4.8% in terms of network power consumption, 7.3% in terms of spectrum usage, and 4.3% in terms of global exposure.
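    One common way to pick a trade-off point among several KPIs is a weighted sum over normalized objectives. The sketch below is a generic illustration of that idea, not the paper's cloud sharing-decision algorithm; the candidate configurations and equal weights are assumptions:

```python
def normalize(values):
    """Min-max normalize a list of KPI values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def best_tradeoff(configs, weights=(1/3, 1/3, 1/3)):
    """Return the index of the configuration minimizing a weighted sum
    of three normalized KPIs: (power_W, exposure_Vpm, spectrum_MHz)."""
    cols = list(zip(*configs))                       # one column per KPI
    norm = list(zip(*(normalize(c) for c in cols)))  # back to per-config rows
    scores = [sum(w * k for w, k in zip(weights, kpis)) for kpis in norm]
    return scores.index(min(scores))

# Hypothetical candidate configurations: (power, exposure, spectrum)
candidates = [(120, 0.8, 40), (95, 0.9, 36), (80, 0.6, 30)]
print(best_tradeoff(candidates))  # -> 2 (dominates on all three KPIs)
```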

    IoT-based management platform for real-time spectrum and energy optimization of broadcasting networks

    We investigate the feasibility of Internet of Things (IoT) technology to monitor and improve the energy efficiency and spectrum usage efficiency of broadcasting networks in the Ultra-High Frequency (UHF) band. Traditional broadcasting networks are designed with a fixed radiated power to guarantee a certain service availability. However, excessive fading margins often lead to inefficient spectrum usage, higher interference, and higher power consumption. We present an IoT-based management platform capable of dynamically adjusting the broadcasting network radiated power according to the current propagation conditions. We assess the performance of and benchmark two IoT solutions (i.e., LoRa and NB-IoT). By means of the IoT management platform, the broadcasting network with adaptive radiated power reduces the power consumption by 15% to 16.3% and increases the spectrum usage efficiency by 32% to 35% (depending on the IoT platform). The IoT feedback loop represents less than 2% of the system power consumption. In addition, white space spectrum availability for secondary wireless telecommunications services is increased by 34% during 90% of the time.
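    The core of such a feedback loop is simple: when the measured fading margin exceeds the target, back off the radiated power by the surplus; when it falls short, raise it, within hardware limits. A minimal single-step sketch, where the target margin and power limits are assumed values, not those of the paper:

```python
def adapt_tx_power(p_tx_dbm, measured_margin_db, target_margin_db=3.0,
                   p_min=20.0, p_max=40.0):
    """One step of an adaptive-power feedback loop: keep only the target
    fading margin, clamped to the transmitter's power range (dBm)."""
    p_new = p_tx_dbm - (measured_margin_db - target_margin_db)
    return max(p_min, min(p_max, p_new))

# Good propagation: 10 dB of margin measured -> back off by 7 dB.
print(adapt_tx_power(40.0, 10.0))   # -> 33.0
# Deep fade: negative margin -> power raised, but capped at p_max.
print(adapt_tx_power(38.0, -6.0))   # -> 40.0
```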

    Emulation of a dynamic broadcasting network with adaptive radiated power in a real scenario

    Broadcasting networks are an efficient means for delivering media content to a high density of users, because their operational cost is almost independent of the size of their audience for a given coverage area. However, when the propagation conditions are better than the worst-case design, the energy efficiency is suboptimal. In this paper, we present the results of a trial to emulate the performance of a dynamic broadcasting network with adaptive radiated power in a real broadcasting scenario. We assess the radiated power of the broadcasting network in a Cuban environment by means of a monitoring device. The power consumption of the dynamic broadcasting network with adaptive radiated power is assessed and compared with traditional broadcasting for different implementation margins. To emulate the performance of the dynamic broadcasting network with adaptive radiated power, we consider a commercial Digital Terrestrial Multimedia Broadcast (DTMB) transmitter in Havana, Cuba. Testbed hardware is designed and developed to measure the fading with a commercial receiver and to emulate the signal reception under adaptive power conditions. The dynamic broadcasting network performance is assessed following the general guidelines and techniques for the evaluation of digital terrestrial television broadcasting systems recommended in the ITU-R BT.2035-2 report.

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low-middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of 'single-use' consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low-middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low-middle-income countries.

    Dynamic Interference Optimization in Cognitive Radio Networks for Rural and Suburban Areas

    In this paper, we investigate the coexistence of cognitive radio networks on TV white spaces for rural and suburban connectivity. Although experimental models and laboratory measurements defined the maximum interference threshold for TV white space technologies for general use cases, our research found that in real wireless rural and suburban scenarios, severe interference to the broadcasting services might occur. This is particularly relevant when the traffic load of the telecom base stations (BSs) exceeds 80% of their maximum capacity. We propose a dynamic management algorithm for minimizing the interference, based on a centralized access control architecture for cognitive radio wireless networks. In an experimental emulation assessing the impact of cognitive radio interference on the broadcasting service's QoE, our method reduced the video distortion perceived by broadcasting users by at least 50% and 27.5% in a rural and a suburban scenario, respectively, while increasing the spectrum usage by just 8%.
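    A centralized controller acting on the 80% load threshold could, in the simplest form, cap each base station's traffic and offload the surplus. This is a hypothetical sketch of that idea, not the paper's algorithm; the BS names and loads are made up:

```python
def cap_bs_loads(bs_loads, load_threshold=0.8):
    """Flag base stations whose traffic load exceeds the interference-risk
    threshold and return the surplus traffic to offload per BS."""
    offloaded = {}
    for name, load in bs_loads.items():
        if load > load_threshold:
            offloaded[name] = round(load - load_threshold, 3)
    return offloaded

# Hypothetical per-BS traffic loads (fraction of maximum capacity).
print(cap_bs_loads({"bs1": 0.95, "bs2": 0.60, "bs3": 0.85}))
# -> {'bs1': 0.15, 'bs3': 0.05}
```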

    Experimental assessment of new generation radio networks based on layered division multiplexing

    Despite the recent advances in broadband penetration and accessibility, broadcasting networks continue to be the most efficient way of delivering media content to large areas independently of the user density. Nevertheless, a convergence of broadcasting services into broadband networks is foreseen. In this context, Layered Division Multiplexing (LDM) allows the joint provision of unicast, multicast, and broadcast services over mobile cell infrastructure. Despite the success of LDM in broadcasting infrastructure, its practical application in Long Term Evolution networks providing Evolved Multimedia Broadcast Multicast Services (LTE eMBMS) and 5G-MBMS is more difficult. In this paper, we experimentally quantify the inter-cell interference and its impact on the network performance and Quality of Service. Our experiments reveal that the inter-cell interference margin for LTE eMBMS is about 3 dB higher compared to traditional LTE networks. If a layered architecture is incorporated, this higher inter-cell interference would cause a reduction of approximately 19% of the Enhanced Layer (Lower Layer) coverage.

    Radio environment map of an LTE deployment based on machine learning estimation of signal levels

    Accurate estimation of Propagation Path Loss is important for reliable and optimized coverage of a service. In the literature, a variety of theoretically and experimentally based propagation models have been documented to estimate the received signal level. The goal of this work is to estimate the effective coverage area of a service, predict the Path Loss, and build a Radio Environment Map (REM) using a sensor network. To this end, a sensor's correlation area is defined. By using Machine Learning (ML), the received signal level variation in this area can be estimated correctly 92.3% of the time, with a Mean Absolute Error (MAE) of 1.57 dB. Finally, a proper distribution of sensors based on the correlation area, together with ML tools, leads to building a REM for the effective coverage area. This approach is applied to a Long-Term Evolution network.
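    To make the REM idea concrete, here is a deliberately simple spatial interpolator: inverse-distance weighting of sensor readings. This is a stand-in for the paper's ML estimator, not the method it actually uses; the sensor positions and RSSI values are hypothetical:

```python
import math

def idw_estimate(sensors, point, power=2.0):
    """Estimate the received signal level (dBm) at `point` by
    inverse-distance weighting of nearby sensor readings."""
    num = den = 0.0
    for (x, y), rssi in sensors:
        d = math.hypot(point[0] - x, point[1] - y)
        if d == 0.0:
            return rssi  # exactly on a sensor: return its reading
        w = d ** -power
        num += w * rssi
        den += w
    return num / den

# Hypothetical sensors: ((x_m, y_m), rssi_dbm)
sensors = [((0, 0), -70.0), ((100, 0), -80.0), ((0, 100), -75.0)]
# Equidistant from all three sensors -> plain average of the readings.
print(round(idw_estimate(sensors, (50, 50)), 2))  # -> -75.0
```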

    Indoor genetic algorithm-based 5G network planning using a machine learning model for path loss estimation

    Accurate wireless network planning is crucial for the deployment of new wireless services. This usually requires the consecutive evaluation of many candidate solutions, which is only feasible for simple path loss models, such as one-slope models or multi-wall models. However, such path loss models are quite simplistic and often do not deliver satisfactory estimations, eventually impacting the quality of the proposed network deployment. More advanced models, such as Indoor Dominant Path Loss models, are usually more accurate, but as their path loss calculation is much more time-consuming, it is no longer possible to evaluate a large set of candidate deployment solutions. Out of necessity, a heuristic network planning algorithm is then typically used, but the outcomes heavily depend on the quality of the heuristic. Therefore, this paper investigates the use of Machine Learning to approximate a complex 5G path loss model. The much lower calculation time allows using this model in a Genetic Algorithm-based network planning algorithm. The Machine Learning model is trained on two buildings and validated on three other buildings, with a Mean Absolute Error below 3 dB. It is shown that the new approach is able to find a wireless network deployment solution with an equal, or smaller, number of access points, while still providing the required coverage for at least 99.4% of the receiver locations, and it does so 15 times faster. Unlike a heuristic approach, the proposed one also allows accounting for additional design criteria, such as maximal average received power throughout the building, or minimal exposure to radiofrequency signals in certain rooms.
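    The key ingredient in such a Genetic Algorithm is a fitness function that rewards fewer access points while enforcing the coverage constraint, evaluated with a fast path loss surrogate. The sketch below uses a free-space-like formula in place of the trained ML model; the transmit power, sensitivity, frequency, and layout are assumed values for illustration:

```python
import math

def surrogate_path_loss(d_m, freq_mhz=3500.0):
    """Cheap free-space path loss surrogate (dB) standing in for the
    trained ML model: 20*log10(d_m) + 20*log10(f_MHz) - 27.55."""
    return 20 * math.log10(max(d_m, 1.0)) + 20 * math.log10(freq_mhz) - 27.55

def fitness(aps, receivers, tx_dbm=20.0, sens_dbm=-80.0, cov_target=0.994):
    """GA fitness: among deployments meeting the coverage target,
    fewer access points is fitter; otherwise penalize the shortfall."""
    covered = 0
    for rx in receivers:
        best_rx = max(tx_dbm - surrogate_path_loss(math.dist(rx, ap))
                      for ap in aps)
        covered += best_rx >= sens_dbm
    coverage = covered / len(receivers)
    if coverage < cov_target:
        return -1e9 * (cov_target - coverage)  # heavy infeasibility penalty
    return -len(aps)  # maximize fitness == minimize AP count

# Hypothetical 100 m x 100 m floor sampled on a 10 m receiver grid.
receivers = [(x, y) for x in range(0, 100, 10) for y in range(0, 100, 10)]
print(fitness([(25, 25), (75, 75)], receivers))  # -> -2 (feasible, 2 APs)
```

A GA would then evolve AP coordinate lists under this fitness, which is cheap enough to call thousands of times once the expensive model is replaced by the surrogate.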